Web Survey Bibliography
There is ongoing debate about whether questions should be presented in a grid or as a single item per screen. Operationally, grids take less time for respondents to complete, so their use should decrease response burden, although recent research shows that respondents seem to prefer a single item per screen. From a measurement point of view, grids pose numerous issues: higher item non-response, higher item non-differentiation, and sometimes higher measurement error.
In this experiment, we test the Vitality (4 items) and Mental Health (5 items) scales of the SF-36v2® Health Survey. The SF-36v2 asks 36 questions to measure functional health and well-being from the patient's point of view. It is called a generic health survey because it can be used across age (18 and older), disease, and treatment groups, as opposed to a disease-specific health survey, which focuses on a particular condition or disease. Two of the four Vitality items and two of the five Mental Health items are reverse-scored.
A sample of 2,500 KnowledgePanel® respondents was randomly assigned to one of five experimental conditions: Group 1: Standard grid; Group 2: Shaded grid; Group 3: One item per screen with horizontal response options; Group 4: One item per screen with vertical response options; Group 5: One item per screen with vertical shaded response options. Approximately 360 respondents completed the survey per condition, for a completion rate of 73.4%. The survey was optimized to be viewed on a screen with a minimum resolution of 800 by 600 pixels. During the study we collected the browser type for each respondent, which allowed us to exclude cases in which the survey was taken on MSN TV or on an iPhone/PDA, because those devices could not properly display the grid items. The final sample used for the analysis, after exclusions, was 1,419 cases, for an average group size of about 280.
We hypothesized that items presented in a grid would lead to more measurement error, as indicated by a higher rate of “inconsistencies” in the self-reports to grid questions and a lower rate of inconsistencies in the self-reports to the single-item questions. We speculated that presenting each item on its own screen allows the respondent to bring more cognitive focus to each question and therefore answer more consistently. In contrast, when items are in a grid, it is easier for the respondent to become confused, especially when some of the items are reverse-scored. We computed an index of consistency by correlating the total sum of scores for the reversed items with the total sum of scores for the non-reversed items. If respondents are consistent in their answers, the correlation between the reversed and non-reversed sums should be higher. We also calculated Cronbach's alpha to measure consistency of answers within each of the five experimental conditions.
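The two statistics described above can be sketched as follows. This is a minimal illustration using NumPy with simulated 1–5 Likert responses; the number of respondents, the item positions flagged as reverse-scored, and the data itself are assumptions for illustration only, not the study's actual data.

```python
import numpy as np

# Simulated 1-5 Likert responses for a hypothetical 5-item scale
# (stand-in for the Mental Health scale; data are random, for illustration).
rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(280, 5)).astype(float)

REVERSED = [1, 3]    # assumed positions of the reverse-scored items
NORMAL = [0, 2, 4]   # remaining (non-reversed) items

# Reverse-code the flagged items on a 1-5 scale: x -> 6 - x.
scored = responses.copy()
scored[:, REVERSED] = 6 - scored[:, REVERSED]

# Consistency index: Pearson correlation between the summed reversed
# items and the summed non-reversed items.
rev_sum = scored[:, REVERSED].sum(axis=1)
norm_sum = scored[:, NORMAL].sum(axis=1)
consistency = np.corrcoef(rev_sum, norm_sum)[0, 1]

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total variance)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

alpha = cronbach_alpha(scored)
```

In the study this computation would be repeated separately within each of the five experimental conditions, so that the alphas and consistency correlations can be compared across presentation formats.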
The direction of the study findings was consistent with our hypotheses -- a lower alpha for the grid presentations and a higher correlation for the single-item presentations -- although the differences among groups did not reach statistical significance.